Mean field inference for the Dirichlet process mixture model
Authors
Abstract
Similar Articles
Distributed Inference for Dirichlet Process Mixture Models
Bayesian nonparametric mixture models based on the Dirichlet process (DP) have been widely used for solving problems like clustering, density estimation and topic modelling. These models make weak assumptions about the underlying process that generated the observed data. Thus, when more data are collected, the complexity of these models can change accordingly. These theoretical properties often...
Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models
We develop a sequential low-complexity inference procedure for the Infinite Gaussian Mixture Model (IGMM) for the general case of an unknown mean and covariance. The observations are sequentially allocated to classes based on a sequential maximum a posteriori (MAP) criterion. We present an easily computed, closed form for the conditional likelihood, in which the parameters can be recursively upd...
Variational Inference for Beta-Bernoulli Dirichlet Process Mixture Models
A commonly used paradigm in diverse application areas is to assume that an observed set of individual binary features is generated from a Bernoulli distribution with probabilities varying according to a Beta distribution. In this paper, we present our nonparametric variational inference algorithm for the Beta-Bernoulli observation model. Our primary focus is clustering discrete binary data usin...
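The Beta-Bernoulli observation model described above can be sketched as a small generative simulation. The dimensions and hyperparameters below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 50 individuals, each with 8 binary features.
n_individuals, n_features = 50, 8

# Each feature's Bernoulli probability is drawn from a Beta prior;
# a and b are hypothetical hyperparameters chosen for this sketch.
a, b = 2.0, 5.0
probs = rng.beta(a, b, size=n_features)  # p_j ~ Beta(a, b)

# Each individual's binary feature vector: x_ij ~ Bernoulli(p_j).
X = rng.binomial(1, probs, size=(n_individuals, n_features))

print(X.shape)  # (50, 8)
```

Clustering such data then amounts to inferring which individuals share the same latent probability vector, which is where the Dirichlet process mixture enters.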
The Dirichlet Process Mixture (DPM) Model
The Dirichlet distribution forms our first step toward understanding the DPM model. The Dirichlet distribution is a multi-parameter generalization of the Beta distribution and defines a distribution over distributions, i.e. the result of sampling a Dirichlet is a distribution on some discrete probability space. Let Θ = {θ1,θ2, . . . ,θn} be a probability distribution on the discrete space = { 1...
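The key property stated above, that a draw from a Dirichlet is itself a probability distribution on a discrete space, can be verified directly. A minimal NumPy sketch (the concentration parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Dirichlet over a 5-point discrete space; the concentration
# parameters alpha are arbitrary illustrative values.
alpha = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# A single draw theta = (theta_1, ..., theta_5) is itself a
# probability distribution: non-negative entries summing to 1.
theta = rng.dirichlet(alpha)

print(np.isclose(theta.sum(), 1.0))  # True
print(np.all(theta >= 0))            # True
```

With all alpha_i = 1 this reduces to a uniform distribution over the probability simplex, and for n = 2 it recovers the Beta distribution, consistent with the generalization described above.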
Memoized Online Variational Inference for Dirichlet Process Mixture Models
Variational inference algorithms provide the most effective framework for large-scale training of Bayesian nonparametric models. Stochastic online approaches are promising, but are sensitive to the chosen learning rate and often converge to poor local optima. We present a new algorithm, memoized online variational inference, which scales to very large (yet finite) datasets while avoiding the com...
Journal
Journal title: Electronic Journal of Statistics
Year: 2009
ISSN: 1935-7524
DOI: 10.1214/08-ejs339